Upgrading ILP Rules to First-Order Bayesian Networks
Abstract
Inductive Logic Programming (ILP) is an efficient technique for relational data mining, but when ILP is applied in imperfect domains, the rules it induces often suffer from overfitting. This paper proposes a method for learning first-order Bayesian networks (FOBNs), which can handle imperfect data robustly. Because learning an FOBN directly is computationally expensive, we adapt an ILP system and a Bayesian network learner to construct the FOBN. We propose a feature extraction algorithm that generates features from ILP rules and uses these features as the main structure of the FOBN. We also propose a propositionalisation algorithm that translates the original relational data into a single-table format, so that a standard Bayesian network learner can learn the remaining parts of the FOBN structure and its conditional probability tables.
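The propositionalisation step described above can be illustrated with a minimal sketch. The schema, feature names, and helper functions below are hypothetical stand-ins, not the paper's actual algorithm: each extracted ILP rule body is treated as a boolean test, and every relational example is translated into one row of a single attribute-value table.

```python
# Hypothetical sketch of propositionalisation: one boolean column per
# ILP-extracted feature, one row per relational example.

def propositionalise(examples, rule_features):
    """Translate relational examples into a single-table representation."""
    table = []
    for ex in examples:
        # Evaluate each rule-derived feature (a boolean test) on the example.
        row = {name: int(test(ex)) for name, test in rule_features.items()}
        row["class"] = ex["class"]
        table.append(row)
    return table

# Toy relational examples (hypothetical schema: sets of ground atoms).
examples = [
    {"class": "pos", "atoms": {("parent", "a", "b"), ("parent", "b", "c")}},
    {"class": "neg", "atoms": {("parent", "a", "b")}},
]

# A hand-written stand-in for a feature extracted from an ILP rule body,
# here "X has a grandchild": parent(X, Y) and parent(Y, Z).
rule_features = {
    "f1_has_grandchild": lambda ex: any(
        ("parent", y, z) in ex["atoms"]
        for (_, x, y) in ex["atoms"]
        for (_, _, z) in ex["atoms"]
    ),
}

table = propositionalise(examples, rule_features)
```

The resulting table can then be handed directly to any attribute-value Bayesian network learner.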
Similar papers
Learning Directed Probabilistic Logical Models Using Ordering-Search
There is an increasing interest in upgrading Bayesian networks to the relational case, resulting in so-called directed probabilistic logical models. In this paper we discuss how to learn non-recursive directed probabilistic logical models from relational data. This problem has already been tackled before by upgrading the structure-search algorithm for learning Bayesian networks. In this paper w...
Extending Bayesian Logic Programs for Plan Recognition and Machine Reading
Statistical relational learning (SRL) is the area of machine learning that integrates both first-order logic and probabilistic graphical models. The advantage of these formalisms is that they can handle both uncertainty and structured/relational data. As a result, they are widely used in domains like social network analysis, biological data analysis, and natural language processing. Bayesian Lo...
Balios - The Engine for Bayesian Logic Programs
Inductive Logic Programming (ILP) [4] combines techniques from machine learning with the representation of logic programming. It aims at inducing logical clauses, i.e., general rules, from specific observations and background knowledge. Because of this focus on logical clauses, traditional ILP systems do not model uncertainty explicitly. On the other hand, state-of-the-art probabilistic models such...
Machine Learning of Bayesian Networks Using Constraint Programming
Bayesian networks are a widely used graphical model with diverse applications in knowledge discovery, classification, prediction, and control. Learning a Bayesian network from discrete data can be cast as a combinatorial optimization problem, and there has been much previous work on applying optimization techniques, including proposals based on ILP, A* search, and depth-first branch-and-bound (BnB) se...
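The combinatorial view mentioned in that abstract can be sketched in a few lines. This is a toy illustration under stated assumptions (two binary variables, a plain maximum-likelihood score, exhaustive enumeration), not the paper's BnB or ILP formulation; real learners use penalised scores such as BIC or BDeu and sophisticated search:

```python
# Toy sketch: cast structure learning as optimisation by enumerating all
# DAGs over two variables and picking the highest-scoring one.
from math import log

# Hypothetical binary observations for variables A and B (B copies A).
data = [
    {"A": 0, "B": 0}, {"A": 1, "B": 1},
    {"A": 0, "B": 0}, {"A": 1, "B": 1},
]

def family_score(child, parents, data):
    """Log-likelihood of `child` given `parents` under ML parameters."""
    counts = {}
    for row in data:
        key = (tuple(row[p] for p in parents), row[child])
        counts[key] = counts.get(key, 0) + 1
    score = 0.0
    for (pa, _), n in counts.items():
        # Normalise within each parent configuration.
        total = sum(c for (p2, _), c in counts.items() if p2 == pa)
        score += n * log(n / total)
    return score

# All candidate structures over {A, B}: empty, A->B, B->A.
structures = [
    {"A": (), "B": ()},
    {"A": (), "B": ("A",)},
    {"A": ("B",), "B": ()},
]

# Decomposable score: the network score is the sum of family scores.
best = max(structures,
           key=lambda s: sum(family_score(v, ps, data) for v, ps in s.items()))
```

On this perfectly correlated toy data, either single-edge structure beats the empty graph; the decomposability of the score is what makes BnB and ILP encodings of the search feasible.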
Towards Learning Non-recursive LPADs by Transforming Them into Bayesian Networks
Logic programs with annotated disjunctions, or LPADs, are an elegant knowledge representation formalism that can be used to combine first-order logical and probabilistic inference. While LPADs can be written manually, one can also consider the question of how to learn them from data. Methods for learning restricted classes of LPADs have been proposed before, but the problem of learning any kind...